From information to probability: An axiomatic approach - Inference is information processing

Authors

  • Wilhelm Rödder
  • Gabriele Kern-Isberner
Abstract

We define the very rich language of composed conditionals on a three-valued logic and use this language as the communication tool between man and machine. Communication takes place for three reasons: knowledge acquisition, query, and response. Learning, thinking, and answering questions are of a purely information-theoretic nature. The pivot of this knowledge-processing concept is the amount of information, in bits, that we receive when we learn that a conditional has become true. We follow an axiomatic approach to information theory rather than Shannon's classical probabilistic approach: information comes first, then probability. In the light of this philosophy, query and response receive new interpretations. Acquisition and response are realized by maximizing entropy and minimizing relative entropy, respectively. The iterative solution of these mathematical optimization problems gives new insight into the adaptation of prior knowledge to new information. The expert system shell SPIRIT supports this kind of knowledge processing, which is demonstrated by suitable examples.
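As a purely illustrative sketch of the two optimization steps named in the abstract, and not a formulation taken from the paper itself (which works over composed conditionals in a three-valued logic), the standard maximum-entropy / minimum-relative-entropy reading over probabilistic conditionals (B_i | A_i)[x_i] can be written in LaTeX as follows; P, Q, omega, and the evidence E are our own illustrative notation:

% Illustrative sketch only: our notation, not the paper's formulation.
\begin{align}
  % Knowledge acquisition: among all distributions satisfying the learned
  % conditionals (B_i | A_i) with target probabilities x_i, choose the one
  % of maximal entropy (measured in bits).
  P^{*} &= \operatorname*{arg\,max}_{P}\; -\sum_{\omega} P(\omega)\,\log_{2} P(\omega)
          \quad \text{s.t.}\quad P(B_i \mid A_i) = x_i,\ i = 1,\dots,n, \\
  % Query/response: adapt the prior P^{*} to new evidence E by minimizing
  % relative entropy (Kullback-Leibler divergence, again in bits).
  Q^{*} &= \operatorname*{arg\,min}_{Q\,\models\,E}\; \sum_{\omega} Q(\omega)\,\log_{2}\frac{Q(\omega)}{P^{*}(\omega)}.
\end{align}

Under this reading, the information received on learning that a conditional holds would roughly correspond to the surprisal -log_2 P(B | A) in bits; the paper's axiomatic approach takes such an information measure as primitive and derives probability from it.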


Similar articles

Integrating Fuzzy Inference System, Image Processing and Quality Control to Detect Defects and Classify Quality Level of Copper Rods

Human-based quality control reduces the accuracy of this process, and in some industries the speed of decision making is very important. To overcome these limitations of human-based quality control, this paper investigates the design of an expert system for automatic and intelligent quality control. Using an intelligent system increases the accuracy of quality control. It...


Axiomatic Characterizations of Information Measures

Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, …, N} representable by joint entropies of components of an N-dimensional rand...


Quantum Information Processing Theory

Definition Quantum information processing theory is an alternative mathematical approach for generating theories of how an observer processes information. Typically, quantum information processing models are derived from the axiomatic principles of quantum probability theory. This probability theory may be viewed as a generalization of classic probability. Quantum information processing models ...


Maximum Entropy and Maximum Probability

Sanov’s Theorem and the Conditional Limit Theorem (CoLT) are established for a multicolor Pólya Eggenberger urn sampling scheme, giving the Pólya divergence and the Pólya extension to the Maximum Relative Entropy (MaxEnt) method. Pólya MaxEnt includes the standard MaxEnt as a special case. The universality of standard MaxEnt advocated by an axiomatic approach to inference for inverse problems i...


Extraction-Based Text Summarization Using Fuzzy Analysis

Due to the explosive growth of the world-wide web, automatic text summarization has become an essential tool for web users. In this paper we present a novel approach for creating text summaries. Using fuzzy logic and word-net, our model extracts the most relevant sentences from an original document. The approach utilizes fuzzy measures and inference on the extracted textual information from the docu...



Journal:
  • Int. J. Intell. Syst.

Volume 18, Issue

Pages -

Publication date 2003